Exploration of mid-air haptics experience design
Ultrasonic Mid-air Haptics (UMH) is a novel technology that uses the mechanical properties of sound waves to create a pressure point in mid-air. This pressure point, called the focal point, can slightly indent the skin and be felt in mid-air without any device attached to the body. This thesis focuses both on studying how to integrate this technology with other senses (i.e. vision and audition) and on exploring the range of tactile sensations it can provide.
The first two projects presented in this document address the integration of ultrasonic mid-air haptics with audio-visual content. The first project describes the process of creating a unique haptic experience that was part of a six-week multisensory exhibition in a museum. The second project moved from the museum to a controlled environment and explored the creation of haptic experiences for six short films based on physiological measurements. Both studies showed the positive value of adding ultrasonic mid-air haptics to traditional media, reflected in higher reported arousal and participants' strong enthusiasm for multisensory content.
The latter two projects of this thesis explored how to extend the range of tactile sensations UMH can provide. We introduced a new technique called Spatio-Temporal Modulation (STM), which enabled the creation of brand-new tactile experiences, including more salient shapes and a wider range of textures. We also provided guidelines on how to control several tactile properties of the sensation, including strength, roughness, and regularity.
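The core idea of STM is to sweep a single focal point rapidly along a path so that the whole path is perceived at once, with the sweep repetition rate setting the perceived vibration. As an illustration only (the function name and parameters are hypothetical, not the thesis's actual implementation), a closed path could be sampled for an STM device like this:

```python
import math

def stm_circle_path(radius_m, draw_freq_hz, update_rate_hz):
    """Sample focal-point positions for one revolution of a circle
    rendered with Spatio-Temporal Modulation: the focal point is swept
    rapidly along the path, and the sweep repetition rate (draw_freq_hz)
    sets the perceived vibration frequency. Illustrative sketch only."""
    n_points = int(update_rate_hz / draw_freq_hz)  # samples per revolution
    return [(radius_m * math.cos(2 * math.pi * k / n_points),
             radius_m * math.sin(2 * math.pi * k / n_points))
            for k in range(n_points)]

# e.g. a 2 cm radius circle drawn 100 times per second on a device
# with an assumed 40 kHz focal-point update rate
path = stm_circle_path(0.02, 100, 40_000)  # 400 positions per revolution
```

The 40 kHz update rate is an assumption for the sketch; actual hardware rates vary by device.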
The findings of these four projects contribute to the growing body of knowledge on UMH. A summary of the key contributions is provided at the end of the thesis, along with several directions for future work.
Creating an illusion of movement between the hands using mid-air touch
Apparent tactile motion (ATM) has been shown to occur across many contiguous parts of the body, such as the fingers, forearms, and back. More recently, the illusion has also been elicited across non-contiguous parts of the body, such as from one hand to the other, whether or not the hands are interconnected by an object placed between them. Here we explore the reproducibility of the intermanual tactile illusion of movement between two free hands using mid-air tactile stimulation. We investigate the optimal parameters for generating continuous and smooth motion using two arrays of ultrasound speakers and two stimulation techniques (i.e. static vs. dynamic focal point). In the first experiment, we investigate the occurrence of the illusion when using a static focal point and define a perceptive model. In the second experiment, we examine the illusion using a dynamic focal point, defining a second perceptive model. Finally, we discuss the differences between the two techniques.
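Apparent motion is typically controlled through the timing between the two stimuli. One commonly cited relation, from Israr and Poupyrev's Tactile Brush (not the perceptive models derived in this work), makes the optimal stimulus-onset asynchrony a linear function of stimulus duration:

```python
def optimal_soa_ms(duration_ms):
    """Stimulus-onset asynchrony (ms) for smooth tactile apparent motion,
    using the linear relation SOA = 0.32 * duration + 47.3 from
    Israr & Poupyrev's Tactile Brush (CHI 2011). The thesis derives its
    own perceptive models for mid-air stimulation, which may differ."""
    return 0.32 * duration_ms + 47.3

# 200 ms stimuli: start the second hand's stimulus ~111 ms after the first
soa = optimal_soa_ms(200)
```

Because the SOA can be shorter than the duration, the two stimuli overlap in time, which is what produces the impression of a single moving stimulus rather than two discrete taps.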
Integrating mid-air haptics into movie experiences
"Seeing is believing, but feeling is the truth." This idiom from the seventeenth-century English clergyman Thomas Fuller gains new momentum in light of the increased proliferation of haptic technologies that allow people to have various kinds of 'touch' and 'touchless' interactions. Here, we report on the process of creating and integrating touchless feedback (i.e. mid-air haptic stimuli) into short movie experiences (i.e. a one-minute movie format). Based on a systematic evaluation of users' experiences of these haptically enhanced movies, we show evidence for the positive effect of haptic feedback during the first viewing experience, as well as for a repeated viewing after two weeks. This opens up a promising design space for content creators and researchers interested in sensory augmentation of audiovisual content. We discuss our findings and the use of mid-air haptic technologies with respect to their effect on users' emotions, changes in the viewing experience over time, and the effects of synchronisation.
Gustatory interface: the challenges of ‘how’ to stimulate the sense of taste
Gustatory interfaces have gained popularity in the field of human-computer interaction, especially in the context of augmenting gaming and virtual reality experiences, but also in the context of food interaction design, enabling the creation of new eating experiences. In this paper, we first review prior work on gustatory interfaces, discussing it according to whether a chemical, electrical, and/or thermal stimulation approach is used. We then present two concepts for gustatory interfaces that represent a more traditional delivery approach (using a mouthpiece) versus a novel approach based on the principles of acoustic levitation (contactless delivery). We discuss the design opportunities around these two concepts, in particular for overcoming the challenges of "how" to stimulate the sense of taste.
LeviSense: a platform for the multisensory integration in levitating food and insights into its effect on flavour perception
Eating is one of the most multisensory experiences in everyday life. All five of our senses (i.e. taste, smell, vision, hearing, and touch) are involved, even if we are not aware of it. However, while multisensory integration has been well studied in psychology, there is no single platform for systematically testing the effects of different stimuli. This gap leaves unresolved challenges for the design of taste-based immersive experiences. Here, we present LeviSense: the first system designed for multisensory integration in gustatory experiences based on levitated food. Our system enables the systematic exploration of different sensory effects on eating experiences. It also opens up new opportunities for other professionals (e.g., molecular gastronomy chefs) looking for innovative taste-delivery platforms. We describe the design process behind LeviSense and conduct two experiments to test a subset of the crossmodal combinations (i.e., taste and vision, taste and smell). Our results show how different lighting and smell conditions affect perceived taste intensity, pleasantness, and satisfaction. We discuss how LeviSense creates new technical, creative, and expressive possibilities in a series of emerging design spaces within Human-Food Interaction.
Sampling strategy for ultrasonic mid-air haptics
Mid-air tactile stimulation using ultrasound has been used in a variety of human-computer interfaces, both as prototypes and as products. When generating tactile patterns with mid-air ultrasonic displays, the common approach has been to sample the patterns using the hardware update rate to its full extent. In the current study we show that the hardware update rate can impact perception but, unexpectedly, we find that higher update rates do not improve pattern perception. In a first user study, we highlight the effect of update rate on the perceived strength of a pattern, especially for patterns rendered at slow rates of less than 10 Hz. In a second user study, we identify how the optimal update rate evolves with pattern size. Our main results show that update rate should be treated as an additional parameter of tactile patterns. We also discuss how the relationships defined in the current study can be implemented in design tools, so that designers are shielded from this additional complexity.
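To see why update rate matters for a pattern of a given size, it helps to relate the two: for a fixed repetition rate, the update rate determines how far apart consecutive focal-point samples land on the pattern. A minimal sketch, assuming a simple circular pattern (the function and the 16 kHz figure are illustrative, not this study's actual apparatus):

```python
import math

def focal_point_spacing_mm(perimeter_mm, pattern_freq_hz, update_rate_hz):
    """Distance between consecutive focal-point samples when a closed
    pattern of the given perimeter is repeated pattern_freq_hz times per
    second and the focal point is repositioned at update_rate_hz.
    Illustrative sketch of the sampling trade-off only."""
    samples_per_cycle = update_rate_hz / pattern_freq_hz
    return perimeter_mm / samples_per_cycle

# a 4 cm diameter circle drawn at 5 Hz: sample spacing shrinks as the
# update rate grows, even though perception may not keep improving
perimeter = math.pi * 40  # ~125.7 mm
coarse = focal_point_spacing_mm(perimeter, 5, 100)    # ~6.3 mm between samples
fine = focal_point_spacing_mm(perimeter, 5, 16_000)   # ~0.04 mm between samples
```

The study's finding that finer sampling does not necessarily yield better perception is what motivates exposing update rate as an explicit design parameter rather than always maximising it.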
Not just seeing, but also feeling art: mid-air haptic experiences integrated in a multisensory art exhibition
The use of the senses of vision and audition as interactive means has dominated the field of Human-Computer Interaction (HCI) for decades, even though nature has provided us with many more senses for perceiving and interacting with the world around us. That said, it has become attractive for HCI researchers and designers to harness touch, taste, and smell in interactive tasks and experience design. In this paper, we present research and design insights gained throughout an interdisciplinary collaboration on a six-week multisensory display – Tate Sensorium – exhibited at the Tate Britain art gallery in London, UK. This is a unique, first-of-its-kind case study on how to design art experiences while considering all the senses (i.e., vision, sound, touch, smell, and taste), in particular touch, which we exploited by capitalising on a novel haptic technology, namely mid-air haptics. We first describe the overall set-up of Tate Sensorium and then describe in detail the design process of the mid-air haptic feedback and its integration with sound for the Full Stop painting by John Latham (1961). This was the first time that mid-air haptic technology was used in a museum context over a prolonged period of time and integrated with sound to enhance the experience of visual art. As part of an interdisciplinary team of curators, sensory designers, and sound artists, we selected a total of three variations of the mid-air haptic experience (i.e., haptic patterns), which were alternated at dedicated times throughout the six-week exhibition. We collected questionnaire-based feedback from 2500 visitors and conducted 50 interviews to gain quantitative and qualitative insights into visitors' experiences and emotional reactions.
While the questionnaire results are generally very positive, with only a small variation in the visitors' arousal ratings across the three tactile experiences designed for the Full Stop painting, the interview data shed light on the differences in the visitors' subjective experiences. Our findings suggest that multisensory designers and art curators should balance surprising experiences against the possibility of free exploration for visitors. In addition, participants expressed that experiencing art with the combination of mid-air haptics and sound was immersive and provided an uplifting experience of touching without touch. We are convinced that the insights gained from this large-scale, real-world field exploration of multisensory experience design exploiting a new and emerging technology provide a solid starting point for the HCI community, creative industries, and art curators to think beyond conventional art experiences. Specifically, our work demonstrates how novel mid-air technology can make art more emotionally engaging and stimulating, especially abstract art that is often open to interpretation.
The how and why behind a multisensory art display
Designing multisensory experiences has always fascinated artists and scientists alike. In recent years, there has been a growing interest in multisensory experience design within the HCI community [1]. Alongside advances in haptic technologies, we see novel work on olfactory and gustatory systems [2,3] and efforts to determine multisensory design spaces [4]. Moreover, artists, museum curators, and creative industries are interested in those emerging technologies for their own work. Here we present Tate Sensorium, a multisensory art display, as an example case of multisensory design.
TastyFloats: a contactless food delivery system
We present two realizations of TastyFloats, a novel system that uses acoustic levitation to deliver food morsels to the user's tongue. To explore TastyFloats' associated design framework, we first address the technical challenges of successfully levitating and delivering different types of food onto the tongue. We then conduct a user study assessing the effect of acoustic levitation on users' taste perception, comparing three basic taste stimuli (i.e., sweet, bitter, and umami) and three droplet volumes (5 µL, 10 µL, and 20 µL). Our results show that users perceive sweet and umami easily, even in minimal quantities, whereas bitter is the least detectable taste, despite its typical association with an unpleasant taste experience. Our results are a first step towards the creation of new culinary experiences and innovative gustatory interfaces.